Search results for "Scatter matrix"

Showing 10 of 14 documents

Spectral density of the correlation matrix of factor models: a random matrix theory approach.

2005

We studied the eigenvalue spectral density of the correlation matrix of factor models of multivariate time series. Using random matrix theory, we analytically quantified the effect of statistical uncertainty on the spectral density due to the finiteness of the sample. We considered a broad range of models, from one-factor models to hierarchical multifactor models.

Keywords: Combinatorics · Scatter matrix · Centering matrix · Matrix function · Statistical physics · Multivariate t-distribution · Nonnegative matrix · Finance · Commerce · Correlation matrix · Random matrix · Square matrix · Data matrix (multivariate statistics) · Mathematics
Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
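The abstract describes how sample finiteness blurs the correlation spectrum of a factor model. As a toy illustration only (the parameters `beta`, `N`, `T` below are illustrative assumptions, not taken from the paper), the sketch simulates a one-factor model and compares the sample correlation eigenvalues with the Marchenko-Pastur bulk expected for pure noise:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 400                  # N time series, T observations; q = N/T sets the finite-sample blur
q = N / T

# One-factor model: x_it = beta_i * f_t + eps_it (toy parameters)
beta = rng.normal(1.0, 0.2, size=N)
f = rng.standard_normal(T)
eps = rng.standard_normal((N, T))
X = beta[:, None] * f[None, :] + eps

# Spectrum of the sample correlation matrix
eigvals = np.linalg.eigvalsh(np.corrcoef(X))

# Marchenko-Pastur bulk edges for a pure-noise correlation matrix
lam_minus, lam_plus = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
n_above = int(np.sum(eigvals > lam_plus))
print(f"largest eigenvalue: {eigvals[-1]:.2f}, MP upper edge: {lam_plus:.2f}")
print(f"eigenvalues above the noise bulk: {n_above}")
```

The single factor produces one eigenvalue far above the noise bulk; the remaining eigenvalues spread over roughly the Marchenko-Pastur interval because of the finite sample.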

On the Computation of Symmetrized M-Estimators of Scatter

2016

This paper focuses on the computational aspects of symmetrized M-estimators of scatter, i.e. multivariate M-estimators of scatter computed on the pairwise differences of the data. Such estimators do not require a location estimate and, more importantly, possess the important block and joint independence properties. These properties are needed, for example, when solving the independent component analysis problem. Classical and recently developed algorithms for computing the M-estimators and the symmetrized M-estimators are discussed. The effect of parallelization is considered, as well as a new computational approach based on using only a subset of pairwise differences. Efficiencies and…

Keywords: Computer science · Computation · Estimator · Multivariate normal distribution · M-estimators · Independent component analysis · Scatter · Scatter matrix · Pairwise comparison · Algorithm · Independence (probability theory) · Econometrics · Block (data storage)
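A minimal sketch of the core idea, assuming the simplest member of this family: Tyler's M-estimator iterated on the pairwise differences of the data (Dümbgen's estimator). This is not the paper's parallelized or subsampled algorithm, only the baseline fixed-point computation it starts from:

```python
import numpy as np

def symmetrized_tyler(X, n_iter=200, tol=1e-9):
    """Tyler's M-estimator of scatter computed on pairwise differences
    (Duembgen's estimator): no location estimate is needed."""
    n, p = X.shape
    i, j = np.triu_indices(n, k=1)
    D = X[i] - X[j]                          # all pairwise differences x_i - x_j
    V = np.eye(p)
    for _ in range(n_iter):
        w = np.einsum('ij,jk,ik->i', D, np.linalg.inv(V), D)  # d^T V^{-1} d per pair
        V_new = p * (D.T / w) @ D / len(D)   # weighted average of outer products
        V_new *= p / np.trace(V_new)         # fix the scale: trace(V) = p
        if np.max(np.abs(V_new - V)) < tol:
            return V_new
        V = V_new
    return V

rng = np.random.default_rng(1)
Sigma = np.array([[2.0, 0.8], [0.8, 1.0]])
X = rng.multivariate_normal([5.0, -3.0], Sigma, size=200)
V = symmetrized_tyler(X)
print(np.round(V, 2))   # proportional to Sigma up to scale; the location [5, -3] never enters
```

Note the quadratic cost in `n` from forming all pairs, which is exactly what motivates the subset-of-differences approach the abstract mentions.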

Image Recognition through Incremental Discriminative Common Vectors

2010

An incremental approach to the discriminative common vector (DCV) method for image recognition is presented. Two different but equivalent ways of computing both common vectors and corresponding subspace projections are considered in the particular context in which new training data become available and learned subspaces may need continuous updating. The two algorithms are based on either scatter matrix eigendecomposition or difference subspace orthonormalization, as in the original DCV method. The proposed incremental methods retain the same good properties as the original one but with a dramatic decrease in computational burden when used in this kind of dynamic scenario. Extensive …

Keywords: Computer science · Pattern recognition · Context (language use) · Machine learning · Automatic image annotation · Discriminative model · Image texture · Scatter matrix · U-matrix · Computer vision · Artificial intelligence · Subspace topology · Feature detection (computer vision)

Sign and Rank Covariance Matrices: Statistical Properties and Application to Principal Components Analysis

2002

In this paper, the estimation of covariance matrices based on multivariate sign and rank vectors is discussed. Equivariance and robustness properties of the sign and rank covariance matrices are described. We show their use for the principal components analysis (PCA) problem. Limiting efficiencies of the estimation procedures for PCA are compared.

Keywords: Covariance matrix · Sparse PCA · Pattern recognition · Covariance · Kernel principal component analysis · Correspondence analysis · Scatter matrix · Principal component analysis · Applied mathematics · Artificial intelligence · Canonical correlation · Mathematics
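A minimal sketch of one of the sign-based scatter estimators discussed in this line of work: the spatial sign covariance matrix, whose eigenvectors can replace those of the ordinary covariance matrix in PCA. The coordinatewise median is used here as a simple robust center purely for illustration (the literature typically uses the spatial median):

```python
import numpy as np

def spatial_sign_covariance(X, center=None):
    """Sign covariance matrix: average outer product of the observations
    radially projected onto the unit sphere around a robust center."""
    if center is None:
        center = np.median(X, axis=0)   # assumption: coordinatewise median as center
    Z = X - center
    U = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    return U.T @ U / len(U)

rng = np.random.default_rng(2)
Sigma = np.array([[4.0, 1.5], [1.5, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], Sigma, size=1000)

_, sign_vecs = np.linalg.eigh(spatial_sign_covariance(X))
_, pca_vecs = np.linalg.eigh(np.cov(X.T))
agreement = abs(sign_vecs[:, -1] @ pca_vecs[:, -1])   # leading eigenvectors, up to sign
print(f"|cos| between first principal directions: {agreement:.3f}")
```

For elliptical data both estimators share eigenvectors in the population, so the sample directions agree closely, while the sign version is far less sensitive to outliers.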

Asymptotic and bootstrap tests for subspace dimension

2022

Most linear dimension reduction methods proposed in the literature can be formulated using an appropriate pair of scatter matrices, see e.g. Ye and Weiss (2003), Tyler et al. (2009), Bura and Yang (2011), Liski et al. (2014) and Luo and Li (2016). The eigen-decomposition of one scatter matrix with respect to another is then often used to determine the dimension of the signal subspace and to separate signal and noise parts of the data. Three popular dimension reduction methods, namely principal component analysis (PCA), fourth order blind identification (FOBI) and sliced inverse regression (SIR) are considered in detail and the first two moments of subsets of the eigenvalues are used to test…

Keywords: Statistics and Probability · Statistics Theory (math.ST) · Methodology (stat.ME) · Principal component analysis · Dimension (vector space) · Scatter matrix · Sliced inverse regression · Applied mathematics · Eigenvalues and eigenvectors · Numerical Analysis · Order determination · Dimensionality reduction · Estimation · Independent component analysis · Multivariate methods · Subspace topology · Signal subspace · Mathematics
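The eigen-decomposition of one scatter matrix with respect to another can be sketched for the FOBI pair (covariance versus a fourth-moment scatter): eigenvalues near one indicate Gaussian noise directions, while eigenvalues away from one flag signal directions. This is only an illustration of the scatter-pair idea, not the paper's asymptotic or bootstrap tests; the mixing setup below is a made-up example:

```python
import numpy as np

def scatter_pair_eigen(X):
    """Eigen-decomposition of a fourth-moment scatter matrix with respect
    to the covariance matrix: the scatter pair underlying FOBI."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S1 = np.cov(Xc.T)
    vals, vecs = np.linalg.eigh(S1)
    S1_inv_half = vecs @ np.diag(vals ** -0.5) @ vecs.T
    r2 = np.sum((Xc @ S1_inv_half) ** 2, axis=1)      # squared Mahalanobis norms
    S2 = (Xc * r2[:, None]).T @ Xc / (n * (p + 2))    # fourth-moment scatter
    # Generalized problem S2 v = lambda * S1 v, solved in whitened coordinates
    lam, W = np.linalg.eigh(S1_inv_half @ S2 @ S1_inv_half)
    return lam[::-1], (S1_inv_half @ W)[:, ::-1]      # descending eigenvalues

rng = np.random.default_rng(3)
n = 4000
# Two Gaussian components (eigenvalue ~ 1) plus one heavy-tailed Laplace component
S = np.column_stack([rng.laplace(size=n), rng.normal(size=n), rng.normal(size=n)])
X = S @ rng.normal(size=(3, 3)).T                     # unknown mixing
lam, V = scatter_pair_eigen(X)
print(np.round(lam, 2))   # one eigenvalue clearly above 1 flags the non-Gaussian direction
```

For an independent component with kurtosis kappa, the population eigenvalue is (kappa + p - 1)/(p + 2), which equals 1 exactly in the Gaussian case; testing how many eigenvalues deviate from 1 is what determines the signal subspace dimension.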

How to simulate normal data sets with the desired correlation structure

2010

The Cholesky decomposition is a widely used method to draw samples from a multivariate normal distribution with a non-singular covariance matrix. In this work we introduce a simple method using the singular value decomposition (SVD) to simulate multivariate normal data even if the covariance matrix is singular, which is often the case in chemometric problems. The covariance matrix can be specified by the user or generated by specifying a subset of the eigenvalues. The latter can be an advantage for simulating data sets with a particular latent structure. This can be useful for testing the performance of chemometric methods with data sets matching the theoretical conditions for their app…

Keywords: Mathematical optimization · Covariance function · Covariance matrix · Process Chemistry and Technology · Multivariate normal distribution · Covariance · Computer Science Applications · Analytical Chemistry · Estimation of covariance matrices · Scatter matrix · Matrix normal distribution · CMA-ES · Algorithm · Spectroscopy · Software · Mathematics
Journal: Chemometrics and Intelligent Laboratory Systems
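The SVD route the abstract describes can be sketched in a few lines: factor the covariance as cov = U diag(s) U^T, use L = U diag(sqrt(s)) as the square root, and transform standard normal draws. Unlike a Cholesky factor, this works for a rank-deficient covariance. The rank-2 example below is an illustrative assumption, not data from the paper:

```python
import numpy as np

def simulate_mvn(mean, cov, size, rng):
    """Draw from N(mean, cov) via SVD; works even when cov is singular,
    where a Cholesky factorization would fail."""
    U, s, _ = np.linalg.svd(cov, hermitian=True)
    L = U * np.sqrt(s)                      # cov = L @ L.T (singular values are >= 0)
    Z = rng.standard_normal((size, len(mean)))
    return mean + Z @ L.T

rng = np.random.default_rng(4)
B = rng.normal(size=(4, 2))
cov = B @ B.T                               # rank-2 covariance in 4 dimensions (singular)
assert np.linalg.matrix_rank(cov) == 2

X = simulate_mvn(np.zeros(4), cov, size=20000, rng=rng)
print(np.round(np.cov(X.T) - cov, 2))       # sample covariance matches cov entrywise
```

Specifying only a subset of nonzero eigenvalues, as the paper suggests, amounts to building `cov` from a chosen `U` and `s` before calling the same routine.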

On Mardia’s Tests of Multinormality

2004

Classical multivariate analysis is based on the assumption that the data come from a multivariate normal distribution. Tests of multinormality have therefore received considerable attention. Several tests for assessing multinormality, among them Mardia's popular multivariate skewness and kurtosis statistics, are based on standardized third and fourth moments. In Mardia's construction of the affine invariant test statistics, the data vectors are first standardized using the sample mean vector and the sample covariance matrix. In this paper we investigate whether, in the test construction, it is advantageous to replace the regular sample mean vector and sample covariance matrix by their affi…

Keywords: Multivariate statistics · Multivariate analysis · Scatter matrix · Statistics · Kurtosis · Multivariate normal distribution · Affine transformation · Bivariate analysis · Mathematics · Statistical hypothesis testing
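The classical construction the abstract refers to (standardize with the sample mean and covariance, then take third and fourth moments) can be sketched as follows; this is the standard textbook form of Mardia's statistics, not the paper's modified versions with alternative location and scatter functionals:

```python
import numpy as np

def mardia_statistics(X):
    """Mardia's multivariate skewness b_{1,p} and kurtosis b_{2,p},
    computed from data standardized by the sample mean and covariance."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                       # ML covariance estimate
    G = Xc @ np.linalg.inv(S) @ Xc.T        # Gram matrix of standardized vectors
    b1 = np.sum(G ** 3) / n ** 2            # skewness: average of (z_i^T z_j)^3
    b2 = np.mean(np.diag(G) ** 2)           # kurtosis: average squared Mahalanobis norm^2
    return b1, b2

rng = np.random.default_rng(5)
p = 3
X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=2000)
b1, b2 = mardia_statistics(X)
print(f"b1 = {b1:.3f} (close to 0 under normality)")
print(f"b2 = {b2:.3f} (close to p(p+2) = {p * (p + 2)} under normality)")
```

Under multinormality, n*b1/6 is asymptotically chi-squared with p(p+1)(p+2)/6 degrees of freedom, and b2 is asymptotically normal around p(p+2), which is what the tests exploit.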

Sign and rank covariance matrices

2000

The robust estimation of multivariate location and shape is one of the most challenging problems in statistics and is crucial in many application areas. The objective is to find highly efficient, robust, computable and affine equivariant location and covariance matrix estimates. In this paper, three different concepts of multivariate sign and rank are considered, and their ability to carry information about the geometry of the underlying distribution (or data cloud) is discussed. New techniques for robust covariance matrix estimation based on different sign and rank concepts are proposed, and algorithms for computing them are outlined. In addition, new tools for evaluating the qualitative and quant…

Keywords: Statistics and Probability · Covariance function · Covariance matrix · Applied Mathematics · Covariance intersection · Covariance · Estimation of covariance matrices · Matérn covariance function · Scatter matrix · Statistics · Rational quadratic covariance function · Algorithm · Mathematics
Journal: Journal of Statistical Planning and Inference

The affine equivariant sign covariance matrix: asymptotic behavior and efficiencies

2003

We consider the affine equivariant sign covariance matrix (SCM) introduced by Visuri et al. (J. Statist. Plann. Inference 91 (2000) 557). The population SCM is shown to be proportional to the inverse of the regular covariance matrix. The eigenvectors and standardized eigenvalues of the covariance matrix can thus be derived from the SCM. We also construct an estimate of the covariance and correlation matrix based on the SCM. The influence functions and limiting distributions of the SCM and its eigenvectors and eigenvalues are found. Limiting efficiencies are given in the multivariate normal and t-distribution cases. The estimates are highly efficient in the multivariate normal case and perform …

Keywords: Statistics and Probability · Covariance function · Affine equivariance · Influence function · Multivariate normal distribution · Robustness · Efficiency · Estimators · Estimation of covariance matrices · Scatter matrix · Statistics · Applied mathematics · CMA-ES · Multivariate sign · Covariance and correlation matrices · Multivariate median · Principal components · Numerical Analysis · Covariance matrix · Discriminant analysis · Covariance · Dispersion matrices · Law of total covariance · Multivariate location · Tests · Eigenvectors and eigenvalues · Mathematics
Journal: Journal of Multivariate Analysis

Symmetrised M-estimators of multivariate scatter

2007

In this paper we introduce a family of symmetrised M-estimators of multivariate scatter. These are defined as M-estimators computed on pairwise differences of the observed multivariate data; symmetrised Huber's M-estimator and Dümbgen's estimator serve as our examples. The influence functions of the symmetrised M-functionals are derived, and the limiting distributions of the estimators are discussed in the multivariate elliptical case in order to study the robustness and efficiency properties of the estimators. The symmetrised M-estimators have the important independence property; they can therefore be used to find the independent components in independent component analysis (ICA).

Keywords: Statistics and Probability · Elliptical distribution · Influence function · Multivariate statistics · Numerical Analysis · Estimator · Efficiency · M-estimator · Independent component analysis · Efficient estimator · Scatter matrix · Statistics · Applied mathematics · Robustness · Independence (probability theory) · Mathematics
Journal: Journal of Multivariate Analysis